W11A. Peer-Reviewing the Methodology Section
1. Summary
1.1 Purpose of Peer-Reviewing a Methodology
Peer review of the Methodology section serves the same dual function it performs for any academic writing: it gives the author targeted, constructive feedback before final submission, and it trains the reviewer to recognize what a well-designed, well-argued methodology looks like. Because the Methodology is an argument—not merely a list of techniques—it must be evaluated by a reader who asks whether the design choices are justified, coherent, and connected to the stated research objectives.
The Methodology peer-review session directly follows the first draft submission. Authors receive feedback while revision is still possible, and reviewers practice the critical reading skills that transfer to evaluating their own writing.
1.2 The Four-Stage Peer-Review Procedure
The session is structured in four sequential stages that move from independent reading to collaborative discussion.
Stage 1 — Group assignment. Each research group decides, together, which other group’s Methodology section it will review. This assignment is made at the start of the session and determines each reviewer’s workload for Stage 2.
Stage 2 — Individual reading. Working alone, each student reads the assigned Methodology section three times and completes the evaluation table in Handout 1 as they read.
Three readings are required because a single pass rarely surfaces all the strengths and weaknesses of a structured document:
- The first reading establishes a general impression and checks whether the overall research design makes sense.
- The second reading examines specific claims: Are the seven methodology questions answered? Are choices justified? Are limitations acknowledged?
- The third reading confirms earlier observations and checks for patterns—for example, whether justification is consistently missing, or whether the same question is addressed multiple times in contradictory ways.
Critical constraint: Students must not communicate with one another during Stage 2. Each reviewer must form independent conclusions based solely on the text. Peer consultation at this stage introduces anchoring—the tendency for early opinions to shape everyone else’s assessment before they have read carefully—and undermines the value of having multiple independent readers.
Stage 3 — Group discussion. Once all individuals have completed their evaluation tables, the reviewing group reconvenes. Members compare their assessments, discuss where they agree and where they diverge, and reconcile their evaluations into a shared view. Disagreements are productive: they often reveal genuinely ambiguous passages or design choices that could be interpreted in more than one way.
Stage 4 — Cross-group dialogue. The reviewing group meets with the group whose Methodology was reviewed. Feedback is delivered and discussed directly. The authors may ask clarifying questions; the reviewers explain their reasoning. Both sides benefit: authors learn how their design is understood by an external reader, and reviewers consolidate their analytical skills by defending specific assessments.
1.3 What to Evaluate in a Methodology
When completing the evaluation table in Handout 1, a reviewer should systematically check whether the Methodology section addresses each of the seven planning questions covered in W10B, and whether the writing meets the quality standards established in the course.
Specific questions to ask include:
- Research approach: Is an inductive or deductive approach (or a combination) specified? Is the choice explained and connected to the research objective?
- Practical considerations: Does the Methodology acknowledge constraints of time, data access, or researcher skill? Are these constraints addressed rather than ignored?
- Data collection: Are the data collection methods described precisely? Is the choice (survey, interview, questionnaire, observation) justified?
- Secondary data: If secondary data is used, is its source identified and its relevance explained? Is it clear what original analysis the study contributes beyond the existing data?
- Sampling method: Is a specific sampling strategy named and justified? Is the rationale for choosing probability or non-probability sampling explained?
- Sample size: Is the sample size specified or estimated? Is there a rationale based on research objectives, required precision, or a sample size calculator?
- Data analysis: Is the analysis method (quantitative, qualitative, or mixed) specified and justified?
Beyond the seven questions, evaluate the quality of the argumentation:
- Are choices explained, or merely listed?
- Are limitations acknowledged, with a corresponding plan to address them?
- Is the Methodology coherent—do all the pieces fit together into a single, consistent design?
- Is the writing specific—does the reader know exactly what the researchers will do, or are descriptions vague?
1.4 Giving Constructive Feedback
Effective feedback on a Methodology is specific, evidence-based, and actionable. Avoid general evaluations such as “the methods section is incomplete” or “the sampling is well described.” Instead, point to specific passages and explain what they do or fail to do.
A useful feedback structure:
- Identify the specific location (e.g., the sampling subsection, the third paragraph).
- Describe what you observe (e.g., “a stratified sampling method is named but there is no explanation of how the strata were defined”).
- Explain why this is a problem (e.g., “the reader cannot assess whether the strata capture the relevant variation in the population”).
- Suggest a revision direction (e.g., “add one sentence that defines the stratification criteria and explains why those criteria are appropriate for the research question”).
This structure ensures that feedback gives the author something concrete to act on, rather than leaving them to guess what needs to change.
1.5 Homework: Revise Based on Feedback
Following the peer-review session, each group should amend their Methodology section in response to the feedback received. No formal submission is required for this revision, but the improvements should be incorporated into the final proposal draft.